Noise injection into inputs in sparsely connected Hopfield and winner-take-all neural networks (IEEE Transactions on Systems, Man and Cybernetics, Part B)

Author

  • Lipo Wang
Abstract

In this paper, we show that noise injection into inputs in unsupervised learning neural networks does not improve their performance as it does in supervised learning neural networks. Specifically, we show that training noise degrades the classification ability of a sparsely connected version of the Hopfield neural network, whereas the performance of a sparsely connected winner-take-all neural network does not depend on the injected training noise.
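The setting the abstract describes can be illustrated with a small simulation: store bipolar patterns in a Hopfield network via Hebbian learning, prune the weight matrix with a random sparse connectivity mask, and optionally flip input bits during training to model injected noise. The sketch below is a minimal illustration, not the paper's method; the network size, sparsity level, noise rate, and synchronous update rule are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100          # number of neurons (assumed for illustration)
P = 5            # number of stored patterns
SPARSITY = 0.5   # fraction of connections kept (hypothetical value)
FLIP_PROB = 0.1  # training noise: probability of flipping each input bit

# Random bipolar (+1/-1) patterns to store.
patterns = rng.choice([-1, 1], size=(P, N))

def hebbian_weights(train_patterns, sparsity, flip_prob):
    """Hebbian learning from (optionally noise-corrupted) inputs,
    with a random symmetric sparse connectivity mask."""
    # Inject training noise by flipping each bit with probability flip_prob.
    noisy = train_patterns * np.where(
        rng.random(train_patterns.shape) < flip_prob, -1, 1)
    W = noisy.T @ noisy / train_patterns.shape[1]
    # Sparse connectivity: keep a random symmetric subset of connections.
    mask = rng.random((N, N)) < sparsity
    mask = np.triu(mask, 1)
    mask = mask | mask.T
    W = W * mask
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, probe, steps=20):
    """Synchronous Hopfield dynamics from a probe state."""
    s = probe.copy()
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1, -1)
    return s

W_clean = hebbian_weights(patterns, SPARSITY, flip_prob=0.0)
W_noisy = hebbian_weights(patterns, SPARSITY, flip_prob=FLIP_PROB)

# Probe with a mildly corrupted copy of pattern 0 and compare recall overlap.
probe = patterns[0] * np.where(rng.random(N) < 0.05, -1, 1)
m_clean = np.mean(recall(W_clean, probe) == patterns[0])
m_noisy = np.mean(recall(W_noisy, probe) == patterns[0])
print(f"recall overlap, clean training: {m_clean:.2f}")
print(f"recall overlap, noisy training: {m_noisy:.2f}")
```

Comparing the two overlaps over many random seeds is one way to observe the degradation from training noise that the paper reports for sparsely connected Hopfield networks.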


Similar articles


Neural network approach for solving the maximal common subgraph problem

A new formulation of the maximal common subgraph problem (MCSP), that is implemented using a two-stage Hopfield neural network, is given. Relative merits of this proposed formulation, with respect to current neural network-based solutions as well as classical sequential-search-based solutions, are discussed.


Scheduling multiprocessor jobs with resource and timing constraints using neural networks

The Hopfield neural network is extensively applied to obtaining optimal or feasible solutions in many applications, such as the traveling salesman problem (TSP), a typical discrete combinatorial problem. Although it provides rapid convergence, the network frequently settles into a local minimum. Stochastic simulated annealing is a highly effective means of obtaining an optimal sol...


A reference model approach to stability analysis of neural networks

In this paper, a novel methodology called a reference model approach to stability analysis of neural networks is proposed. The core of the new approach is to study a neural network model with reference to other related models, so that different modeling approaches can be combinatively used and powerfully cross-fertilized. Focused on two representative neural network modeling approaches (the neu...


Group updates and multiscaling: an efficient neural network approach to combinatorial optimization

A multiscale method is described in the context of binary Hopfield-type neural networks. The appropriateness of the proposed technique for solving several classes of optimization problems is established by means of the notion of group update which is introduced here and investigated in relation to the properties of multiscaling. The method has been tested in the solution of partitioning and cov...



Journal: IEEE Transactions on Systems, Man and Cybernetics, Part B

Volume:   Issue:

Pages:  -

Publication date: 1998